Decision Fusion on Boosting Ensembles

Authors

  • Joaquín Torres-Sospedra
  • Carlos Hernández-Espinosa
  • Mercedes Fernández-Redondo
Abstract

Training an ensemble of neural networks is an interesting way to build a multi-net system. One of the key factors in designing an ensemble is how to combine the networks to give a single output. Although there are several important methods for building ensembles, Boosting is one of the most important. Most methods based on Boosting use a specific combiner (the Boosting combiner). Although the Boosting combiner provides good results on boosting ensembles, results from previous papers show that the simple Output Average combiner can work better than the Boosting combiner. In this paper, we study the performance of sixteen different combination methods for ensembles previously trained with Adaptive Boosting and Average Boosting. The results show that the accuracy of ensembles trained with these original boosting methods can be improved by using an appropriate alternative combiner.
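The contrast between the two combiners discussed in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: the Boosting combiner is a weighted vote where each network's weight depends on its training error (as in classic AdaBoost), while Output Average simply averages the per-class outputs. The network outputs and error values below are made-up example data.

```python
import math

def boosting_combiner(outputs, errors):
    """Weighted vote: each network votes for its top class with
    weight log((1 - e) / e), the classic AdaBoost combiner."""
    n_classes = len(outputs[0])
    votes = [0.0] * n_classes
    for probs, err in zip(outputs, errors):
        weight = math.log((1.0 - err) / err)  # low-error networks vote more
        top = probs.index(max(probs))         # this network's predicted class
        votes[top] += weight
    return votes.index(max(votes))

def output_average(outputs):
    """Simple combiner: average the per-class outputs, pick the argmax."""
    n_classes = len(outputs[0])
    n_nets = len(outputs)
    avg = [sum(p[c] for p in outputs) / n_nets for c in range(n_classes)]
    return avg.index(max(avg))

# Example: three networks' softmax-style outputs over 3 classes,
# with their (hypothetical) weighted training errors.
outputs = [[0.6, 0.3, 0.1], [0.2, 0.5, 0.3], [0.4, 0.4, 0.2]]
errors = [0.10, 0.25, 0.30]
print(boosting_combiner(outputs, errors))
print(output_average(outputs))
```

Note that the Boosting combiner discards each network's confidence (only the argmax votes), whereas Output Average keeps the full output distribution, which is one intuition for why the simpler combiner can sometimes win.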

Similar papers

Creating diversity in ensembles using artificial data

The diversity of an ensemble of classifiers is known to be an important factor in determining its generalization error. We present a new method for generating ensembles, Decorate (Diverse Ensemble Creation by Oppositional Relabeling of Artificial Training Examples), that directly constructs diverse hypotheses using additional artificially-constructed training examples. The technique is a simple...


An experimental study on diversity for bagging and boosting with linear classifiers

In classifier combination, it is believed that diverse ensembles have a better potential for improvement on the accuracy than nondiverse ensembles. We put this hypothesis to a test for two methods for building the ensembles: Bagging and Boosting, with two linear classifier models: the nearest mean classifier and the pseudo-Fisher linear discriminant classifier. To estimate diversity, we apply n...


Ensembles of decision trees based on imprecise probabilities and uncertainty measures

In this paper, we present an experimental comparison among different strategies for combining decision trees built by means of imprecise probabilities and uncertainty measures. It has been proven that the combination or fusion of the information obtained from several classifiers can improve the...


Boosting Lite - Handling Larger Datasets and Slower Base Classifiers

In this paper, we examine ensemble algorithms (Boosting Lite and Ivoting) that provide accuracy approximating a single classifier, but which require significantly fewer training examples. Such algorithms allow ensemble methods to operate on very large data sets or use very slow learning algorithms. Boosting Lite is compared with Ivoting, standard boosting, and building a single classifier. Comp...


Fusion of Topology Preserving Neural Networks

In this paper, ensembles of self-organizing NNs built through fusion are introduced. In these ensembles, it is not the output signals of the base learners that are combined, but their architectures that are properly merged. Merging algorithms for fusion and boosting-fusion-based ensembles of SOMs, GSOMs and NG networks are presented and positively evaluated on benchmarks from the UCI database.



Journal:

Volume   Issue 

Pages  -

Publication date: 2008